Joint Bayesian Variable and DAG Selection Consistency for High-dimensional Regression Models with Network-structured Covariates


Abstract

We consider the joint sparse estimation of regression coefficients and the covariance matrix of the covariates in a high-dimensional regression model, where the predictors are both relevant to the response variable of interest and functionally related to one another via a Gaussian directed acyclic graph (DAG) model. DAG models introduce sparsity in the Cholesky factor of the inverse covariance matrix, and this sparsity pattern in turn corresponds to specific conditional independence assumptions on the underlying predictors. A variety of methods have been developed in recent years for Bayesian inference in identifying such network-structured predictors in a regression setting, yet crucial selection consistency properties of these methods have not been thoroughly investigated. In this paper, we consider a hierarchical model with spike and slab priors on the regression coefficients and a flexible and general class of DAG-Wishart distributions with multiple shape parameters on the Cholesky factors of the inverse covariance matrix. Under mild regularity assumptions, we establish joint selection consistency when the dimension is allowed to grow much larger than the sample size. We demonstrate that our method outperforms existing methods in selecting network-structured predictors in several simulation settings.
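The model class described in the abstract can be illustrated with a minimal simulation sketch (this is not the paper's method, and the DAG edges, dimensions, and inclusion indicators below are hypothetical choices for illustration): predictors are drawn from a Gaussian DAG model whose sparse lower-triangular Cholesky factor encodes the missing edges, and the response depends on a sparse coefficient vector of the spike-and-slab form.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 5, 200  # hypothetical dimension and sample size

# Lower-triangular Cholesky factor L of the precision matrix Omega = L L^T.
# Zeros below the diagonal correspond to missing edges in the DAG,
# i.e. conditional independence assumptions on the predictors.
L = np.eye(p)
L[1, 0] = 0.8    # hypothetical edge 0 -> 1
L[3, 2] = -0.5   # hypothetical edge 2 -> 3

Omega = L @ L.T               # sparse-Cholesky precision matrix
Sigma = np.linalg.inv(Omega)  # covariance of the predictors

# Network-structured predictors
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# Spike-and-slab-style sparse coefficients: binary inclusion
# indicators (spike at zero) times Gaussian slab draws.
gamma = np.array([1.0, 0.0, 1.0, 0.0, 0.0])      # hypothetical support
beta = gamma * rng.normal(0.0, 2.0, size=p)      # slab on included entries
y = X @ beta + rng.normal(0.0, 1.0, size=n)

print(Omega.shape, y.shape)
```

Joint selection in this setting means recovering both the support `gamma` of the coefficients and the sparsity pattern of `L` from the observed `(X, y)`.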


Similar Articles

Joint Bayesian variable and graph selection for regression models with network-structured predictors.

In this work, we develop a Bayesian approach to perform selection of predictors that are linked within a network. We achieve this by combining a sparse regression model relating the predictors to a response variable with a graphical model describing conditional dependencies among the predictors. The proposed method is well-suited for genomic applications because it allows the identification of ...


On the Consistency of Bayesian Variable Selection for High Dimensional Binary Regression and Classification

Modern data mining and bioinformatics have presented an important playground for statistical learning techniques, where the number of input variables is possibly much larger than the sample size of the training data. In supervised learning, logistic regression or probit regression can be used to model a binary output and form perceptron classification rules based on Bayesian inference. We use a...


A Consistent Variable Selection Criterion for Linear Models with High-dimensional Covariates

We consider the variable selection problem in regression models when the number of covariates is allowed to increase with the sample size. An approach of Zheng and Loh (1995) for the fixed design situation is extended to the case of random covariates. This yields a unified consistent selection criterion for both random and fixed covariates. By using t-statistics to order the covariates, the met...


Semiparametric Quantile Regression with High-dimensional Covariates.

This paper is concerned with quantile regression for a semiparametric regression model, in which both the conditional mean and conditional variance function of the response given the covariates admit a single-index structure. This semiparametric regression model enables us to reduce the dimension of the covariates and simultaneously retains the flexibility of nonparametric regression. Under mil...


Combining a relaxed EM algorithm with Occam's razor for Bayesian variable selection in high-dimensional regression

We address the problem of Bayesian variable selection for high-dimensional linear regression. We consider a generative model that uses a spike-and-slab-like prior distribution obtained by multiplying a deterministic binary vector, which traduces the sparsity of the problem, with a random Gaussian parameter vector. The originality of the work is to consider inference through relaxing the model a...
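The generative model sketched in this abstract can be written down in a few lines (a minimal sketch under assumptions; the dimensions and support below are hypothetical, and the abstract's inference procedure is not shown): the coefficient vector is the elementwise product of a deterministic binary vector encoding sparsity with a random Gaussian parameter vector.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 10  # hypothetical sample size and dimension

z = np.zeros(p)
z[:3] = 1.0                        # deterministic binary support vector
w = rng.normal(0.0, 1.0, size=p)   # random Gaussian parameter vector
beta = z * w                       # spike-and-slab-like coefficients

X = rng.normal(size=(n, p))
y = X @ beta + 0.1 * rng.normal(size=n)

print(int((beta != 0).sum()))  # number of active coefficients
```

Multiplying by `z` zeroes out all but the selected coordinates, so inference over `z` amounts to Bayesian variable selection while `w` carries the slab-like continuous prior.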



Journal

Journal: Statistica Sinica

Year: 2021

ISSN: 1017-0405, 1996-8507

DOI: https://doi.org/10.5705/ss.202019.0202